Storage capacity of two-dimensional neural networks
Abstract
We investigate the maximum number of embedded patterns in the two-dimensional Hopfield model. The ground-state energies of two specific network states, namely, the energies of the pure-ferromagnetic state and of the state with one specific stored pattern, are calculated exactly in terms of the correlation function of the ferromagnetic Ising model. We also investigate the energy landscape around them a...
Similar articles

An Influence of Nonlinearities to Storage Capacity of Neural Networks
More realistic neural-soma and synaptic nonlinear relations, together with an alternative mean-field-theory (MFT) approach relevant for strongly interconnected systems such as cortical matter, are considered. The general procedure of averaging over the quenched random states in fully connected networks for MFT is, as usual, based on Boltzmann Machine learning. But this approach requires an unrealisti...
Optimal storage capacity of neural networks at finite temperatures
Gardner's analysis of the optimal storage capacity of neural networks is extended to study finite-temperature effects. The typical volume of the space of interactions is calculated for strongly diluted networks as a function of the storage ratio α, temperature T, and the tolerance parameter m, from which the optimal storage capacity α_c is obtained as a function of T and m. At zero temperature...
Phase Diagram and Storage Capacity of Sequence Processing Neural Networks
We solve the dynamics of Hopfield-type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods we derive exact closed equations for dynamical order parameters, viz. the sequence ove...
Storage Capacity of RAM-based Neural Networks: Pyramids
Recently the authors developed a modular approach to assess the storage capacity of RAM-based neural networks [1] that can be applied to any architecture. It is based on collisions of information during the learning process. It has already been applied to the GNU architecture. In this paper, the technique is applied to the pyramid. The results explain practical problems reported in the literatur...
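The Hopfield-type pattern storage discussed in the abstracts above can be illustrated with a minimal sketch. This is not code from any of the papers: the network size, number of patterns, noise level, and all names are arbitrary choices for illustration. It stores random ±1 patterns with the standard Hebb rule and recalls one of them from a corrupted cue.

```python
import numpy as np

# Illustrative Hopfield-network sketch (parameters are arbitrary, not from the papers).
rng = np.random.default_rng(0)
N = 100  # number of spins/neurons
P = 5    # number of stored patterns, well below the classical capacity ~0.138*N

# Random binary patterns xi^mu in {-1, +1}^N
patterns = rng.choice([-1, 1], size=(P, N))

# Hebb rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)

def recall(state, steps=10):
    """Synchronous zero-temperature dynamics: s <- sign(J s)."""
    for _ in range(steps):
        state = np.where(J @ state >= 0, 1, -1)
    return state

# Corrupt pattern 0 by flipping 10 of its 100 spins, then recall
noisy = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
noisy[flip] *= -1

# Overlap m = (1/N) * s . xi^0; close to 1 means successful retrieval
overlap = recall(noisy) @ patterns[0] / N
```

At this low storage ratio (P/N = 0.05) the noisy cue relaxes back to the stored pattern, so the final overlap is close to 1; pushing P/N toward the capacity limit makes retrieval fail, which is the regime the papers above analyze.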
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
Journal
Journal title: Physical Review E
Year: 2001
ISSN: 1063-651X,1095-3787
DOI: 10.1103/physreve.65.016124